PROBLEM-SOLVING IN HEALTH CARING FORM DATA SYSTEM

VU HIEU NGUYEN

SOFTWARE ENGINEER



Hieu Nguyen Vu - Oct 15, 2016  ·  6 min read

Overview of this project

Each time a patient visits the health care facility, nurses take a survey to collect information about the patient. During the medical examination, the doctor asks the patient about diseases, living conditions, habits, and so on. All of this data is collected and helps the patient at follow-up visits. At a larger scale, the government can collect the medical data of a residential area or a province and make medical policy decisions.

Storing data on the client side (browser side)

One of my challenges was storing the patient data on the browser side. This means the data is stored on the care facility's own device. We support storing data for up to 5,000 patients per care facility, including sensitive data that could disclose patient information. => We use IndexedDB to store this large amount of data.
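As a rough sketch of how records land in IndexedDB (the article doesn't describe the actual schema, so the database name `health-care-forms`, the `patients` store, and the `patientId` key path are all invented for illustration):

```javascript
// Minimal sketch of the IndexedDB layer. This API only exists in the
// browser; Node has no indexedDB global, so here we only define the
// functions.
function openPatientDb() {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('health-care-forms', 1);
    request.onupgradeneeded = () => {
      // Create the object store on first open / version bump.
      request.result.createObjectStore('patients', { keyPath: 'patientId' });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

function savePatient(db, record) {
  return new Promise((resolve, reject) => {
    const tx = db.transaction('patients', 'readwrite');
    tx.objectStore('patients').put(record); // the encrypted record goes here
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```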

Even on a modern browser like Chrome, users could still view the sensitive data without logging in.

=> AES-256 is a good encryption solution for this case. When the user logs in and needs to view the data, we decrypt it and display it on the screen.

If sensitive data is stored in the browser, how can we keep all the data consistent across all devices of a care facility?

=> We have a backup/restore function that allows the user to export all sensitive data in encrypted form and import it on another device. => I had to validate this file: its encoding, its encryption, its data, its version, etc.
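The kind of checks that validation step performs might look like this; the envelope fields (`magic`, `version`, `cipher`, `payload`) are assumptions, since the article doesn't describe the actual export format:

```javascript
// Hypothetical sketch of validating an exported backup file before import.
const SUPPORTED_VERSIONS = [1, 2];

function validateBackup(json) {
  const errors = [];
  let file;
  try {
    file = JSON.parse(json); // the export is assumed here to be JSON
  } catch {
    errors.push('file is not valid JSON');
    return errors;
  }
  if (file.magic !== 'HCF-BACKUP') errors.push('unknown file format');
  if (!SUPPORTED_VERSIONS.includes(file.version)) errors.push('unsupported version');
  if (file.cipher !== 'aes-256-gcm') errors.push('unexpected encryption algorithm');
  if (typeof file.payload !== 'string' || !/^[A-Za-z0-9+/=]+$/.test(file.payload)) {
    errors.push('payload is not base64-encoded ciphertext');
  }
  return errors;
}
```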

Import function

To support digital transformation, this system provides a function that allows the user to import patient data and form data as CSV files. Importing the form data was a real challenge for me, because it involves up to:

  1. 30 form types, each with its own specific model.
  2. 200 data fields per model.
  3. A maximum of 50 MB per file.
  4. 20 files per single import.

I. Validating the file data before sending it to the server

How did I solve it? By splitting the task. I split the validation into four subtasks:

  1. Validate the file (encoding, file type, data types, number of columns, required columns, authorization, ...).
  2. Validate the fields of each record.
  3. Validate the relations between the fields in a record (grouped values, required within a set, values that depend on another field, ...).
  4. Call an API to validate some specific fields on the server.
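A sketch of the four subtasks as separate validators (field names such as `patientId` and the individual rules are invented examples, not the real 200-field models):

```javascript
// 1. File-level checks (shown here: just the header row / required columns).
function validateFile(rows, requiredColumns) {
  const header = rows[0] || [];
  return requiredColumns
    .filter((c) => !header.includes(c))
    .map((c) => `missing required column: ${c}`);
}

// 2. Field-level checks on one record.
function validateFields(record) {
  const errors = [];
  if (!/^\d+$/.test(record.patientId)) errors.push('patientId must be numeric');
  if (record.visitDate && isNaN(Date.parse(record.visitDate))) {
    errors.push('visitDate is not a valid date');
  }
  return errors;
}

// 3. Cross-field checks: the value of one field depends on another.
function validateRelations(record) {
  const errors = [];
  if (record.smoker === 'yes' && !record.cigarettesPerDay) {
    errors.push('cigarettesPerDay is required when smoker = yes');
  }
  return errors;
}

// 4. Server-side checks go through an API; the call is injected here so
// the sketch stays self-contained.
async function validateOnServer(record, callApi) {
  const exists = await callApi('/patients/exists', record.patientId);
  return exists ? [] : [`unknown patientId: ${record.patientId}`];
}
```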

II. Sending to the server

  1. This function uses an Azure Durable Function on the back end. That means I have to send one request to submit the data, then keep sending follow-up requests until I get a success status (meaning the back-end import has completed).
  2. The business logic allows users to import multiple files with different data types at the same time, and some special rules require certain files to be imported before others, or forbid importing certain files at the same time. This requires controlling many async tasks simultaneously while preserving the business logic and saving the user's time.
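One way to honor such "import X before Y" rules while still running independent files concurrently is to assign each data type a stage and run the stages in order; this staging scheme is my illustration, not necessarily the logic the real system uses:

```javascript
// Run import files in stages: files whose type has a prerequisite wait for
// the earlier stage; files within the same stage run concurrently.
async function importInStages(files, stageOf, importOne) {
  // Group files by their stage number.
  const stages = new Map();
  for (const f of files) {
    const s = stageOf(f.type);
    if (!stages.has(s)) stages.set(s, []);
    stages.get(s).push(f);
  }
  const results = [];
  // Run stages in ascending order; within a stage, import in parallel.
  for (const s of [...stages.keys()].sort((a, b) => a - b)) {
    results.push(...(await Promise.all(stages.get(s).map(importOne))));
  }
  return results;
}
```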

Looks not too challenging, right? The challenge comes when we combine this with [1], the Azure Durable Function back end.
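The polling loop described in [1] can be sketched like this; `startImport` and `checkStatus` stand in for the real HTTP calls (Durable Functions expose a status URL, commonly `statusQueryGetUri`, whose response carries a `runtimeStatus`), so the shape of the loop is the point, not the exact endpoints:

```javascript
// Kick off the orchestration, then poll its status URL until the runtime
// reports completion or failure.
async function runImport(startImport, checkStatus, sleep, maxPolls = 60) {
  const { statusUrl } = await startImport(); // first request starts the import
  for (let i = 0; i < maxPolls; i++) {
    const status = await checkStatus(statusUrl);
    if (status.runtimeStatus === 'Completed') return status.output;
    if (status.runtimeStatus === 'Failed') throw new Error('import failed');
    await sleep(1000); // wait before polling again
  }
  throw new Error('import timed out');
}
```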

IE11 / Edge (legacy Edge) browser support

IE11 and legacy Edge are terrible, but many Japanese users are still on them; that's why we still support them.

  1. They are 32-bit. This means memory gets stuck when it reaches about 1.7-1.8 GB, and the browser auto-reloads (or asks to reload).
  2. They don't give each tab its own thread memory. All the tabs of the browser, service workers, etc. share that same ~1.7 GB.
  3. CSS support is outdated, and some JS functions do not work (though some of them can be fixed with a polyfill).
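As an example of point 3, `Array.prototype.includes` is one of the functions missing on IE11; a simplified polyfill body looks like this (shown as a standalone function, and it ignores negative `fromIndex`, which the full spec handles):

```javascript
// Simplified body of an Array.prototype.includes polyfill for IE11.
function arrayIncludes(arr, searchElement, fromIndex) {
  let i = Math.max(fromIndex | 0, 0);
  for (; i < arr.length; i++) {
    const el = arr[i];
    // SameValueZero comparison: like ===, but NaN also matches NaN.
    if (el === searchElement || (searchElement !== searchElement && el !== el)) {
      return true;
    }
  }
  return false;
}

// On IE11 we would install it only when the native method is missing:
// if (!Array.prototype.includes) {
//   Array.prototype.includes = function (x, from) { return arrayIncludes(this, x, from); };
// }
```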
